
    Auditory display for the blind

    A system for providing an auditory display of two-dimensional patterns as an aid to the blind is described. It includes a scanning device that produces first and second voltages indicating the vertical and horizontal positions of the scan, and a further voltage indicating the intensity at each point of the scan and hence the presence or absence of the pattern at that point. The intensity-related voltage gates transmission of sound to the subject, so that the subject knows a portion of the pattern has been encountered by the scan whenever a tone is heard; the subject determines the position of that portion of the pattern in space from the frequency and interaural difference information contained in the tone.
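
    The mapping described above lends itself to a simple digital illustration. The sketch below is not the patented circuit (which is analog); the function name, parameter ranges, and the choice of a plain level difference rather than a full interaural model are assumptions made for clarity: vertical position drives tone frequency, horizontal position drives the interaural level balance, and scan intensity gates whether a tone is produced at all.

        import math

        def scan_point_to_tone(x, y, intensity, threshold=0.5,
                               f_low=200.0, f_high=2000.0):
            """Map one scan point (x, y, intensity all in 0..1) to (freq_hz, left_gain, right_gain)."""
            if intensity < threshold:
                return None                              # no pattern here: no tone is transmitted
            freq = f_low * (f_high / f_low) ** y         # assumed mapping: higher in the image, higher the pitch
            right_gain = x                               # pattern further right: louder in the right ear
            left_gain = 1.0 - x
            return freq, left_gain, right_gain

        # Example: a pattern point near the top right of the scanned image
        print(scan_point_to_tone(x=0.8, y=0.9, intensity=1.0))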

    An introduction to interactive sonification

    The research field of sonification, a subset of the topic of auditory display, has developed rapidly in recent decades. It brings together interests from the areas of data mining, exploratory data analysis, human–computer interfaces, and computer music. Sonification presents information by using sound (particularly nonspeech), so that the user of an auditory display obtains a deeper understanding of the data or processes under investigation by listening.
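
    As a concrete illustration of presenting information with nonspeech sound, the sketch below performs a simple parameter-mapping sonification: each value of a data series becomes a short sine tone whose frequency tracks the value. It is not taken from the paper; the function name, the frequency range, and the linear value-to-pitch mapping are assumptions.

        import math
        import struct
        import wave

        def sonify(values, path="sonification.wav", rate=44100, note_s=0.25,
                   f_low=220.0, f_high=880.0):
            """Render a data series as a sequence of sine tones and write a mono WAV file."""
            lo, hi = min(values), max(values)
            span = (hi - lo) or 1.0
            frames = bytearray()
            for v in values:
                freq = f_low + (v - lo) / span * (f_high - f_low)   # linear data-to-frequency map
                for n in range(int(rate * note_s)):
                    sample = 0.4 * math.sin(2 * math.pi * freq * n / rate)
                    frames += struct.pack("<h", int(sample * 32767))
            with wave.open(path, "wb") as w:
                w.setnchannels(1)
                w.setsampwidth(2)
                w.setframerate(rate)
                w.writeframes(bytes(frames))

        sonify([3, 5, 4, 8, 7, 9, 2])   # the rise-and-fall contour of the data becomes a melodic contour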

    Virtual acoustic displays

    The real-time acoustic display capabilities developed for the Virtual Environment Workstation (VIEW) Project at NASA-Ames are described. The acoustic display is capable of generating localized acoustic cues in real time over headphones. An auditory symbology, a related collection of representational auditory 'objects' or 'icons', can be designed using ACE (Auditory Cue Editor), which links both discrete and continuously varying acoustic parameters with information or events in the display. During a given display scenario, the symbology can be dynamically coordinated in real time with 3-D visual objects, speech, and gestural displays. The types of displays feasible with the system range from simple warnings and alarms to the acoustic representation of multidimensional data or events.
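
    The kind of binding ACE performs, between display events and discrete or continuously varying acoustic parameters, can be sketched as a small cue table. The class, field, and event names below are assumptions for illustration only; the actual ACE data model is not described in the abstract.

        from dataclasses import dataclass

        @dataclass
        class AuditoryCue:
            freq_hz: float        # discrete parameter: base pitch of the auditory icon
            rate_hz: float        # repetition rate of the icon
            azimuth_deg: float    # direction the cue is localized at over headphones

        cue_table = {
            "system_warning":   AuditoryCue(freq_hz=880.0, rate_hz=4.0, azimuth_deg=0.0),
            "object_proximity": AuditoryCue(freq_hz=440.0, rate_hz=1.0, azimuth_deg=90.0),
        }

        def update_cue(event, azimuth_deg=None, rate_hz=None):
            """Coordinate a cue with the scene by updating its continuously varying parameters."""
            cue = cue_table[event]
            if azimuth_deg is not None:
                cue.azimuth_deg = azimuth_deg      # e.g. track a moving 3-D visual object
            if rate_hz is not None:
                cue.rate_hz = rate_hz              # e.g. urgency grows as the object approaches
            return cue

        print(update_cue("object_proximity", azimuth_deg=45.0, rate_hz=3.0))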

    Concurrent Auditory Stream Discrimination in Auditory Graphing

    This paper is concerned with enhancing human-computer interaction and communication in concurrent streams of auditory display. Auditory display, or auditory graphing, is the sonic representation of numerical data (the auditory equivalent of visualization). It provides an additional channel for information representation, in which a participant's response may be more intuitive and immediate than with (visual) graphical display, but auditory graph design requires understanding and multi-disciplinary investigation of listening due to the instantaneous characteristics of sound. Our aims are to explore (1) the impact of spatial separation for a divided attention task and (2) the efficiency of timbre (tone color) in assisting pitch contour identification. Our findings about timbral and spatial discrimination are scalable and useful for auditory display in a wide variety of contexts. The results provide empirical evidence for further investigation of spatialization and timbre and contribute to applications within an auditory display context for real-world scenarios (e.g. social, statistical, and other datasets likely to be encountered in the workplace).
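
    To make the two manipulations concrete, the sketch below renders two concurrent pitch contours separated both by timbre (a pure tone against a harmonically rich one) and by spatial position (hard left against hard right in the stereo image). The note values, durations, and the additive-synthesis timbre model are assumptions, not the paper's stimuli.

        import math
        import struct
        import wave

        RATE, NOTE_S = 44100, 0.3

        def tone(freq, harmonics):
            """One note as a list of samples; more harmonics gives a brighter timbre."""
            samples = []
            for n in range(int(RATE * NOTE_S)):
                t = n / RATE
                s = sum(math.sin(2 * math.pi * freq * k * t) / k for k in range(1, harmonics + 1))
                samples.append(0.3 * s / harmonics)
            return samples

        stream_a = [262, 294, 330, 349]      # rising contour, pure timbre, left channel
        stream_b = [392, 349, 330, 294]      # falling contour, bright timbre, right channel

        frames = bytearray()
        for fa, fb in zip(stream_a, stream_b):
            left, right = tone(fa, harmonics=1), tone(fb, harmonics=6)
            for l, r in zip(left, right):
                frames += struct.pack("<hh", int(l * 32767), int(r * 32767))

        with wave.open("streams.wav", "wb") as w:
            w.setnchannels(2)
            w.setsampwidth(2)
            w.setframerate(RATE)
            w.writeframes(bytes(frames))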

    Reconfigurable Auditory-Visual Display

    System and method for visual and audible communication between a central operator and N mobile communicators (N greater than or equal to 2), including an operator transceiver and interface configured to receive and display, for the operator, visually perceptible and audibly perceptible signals from each of the mobile communicators. The interface (1) presents the audible signal from each communicator as if it were received from a different location relative to the operator and (2) allows the operator to select, to assign priority to, and to display the visual and audible signals received from a specified communicator. Each communicator has an associated signal transmitter configured to transmit at least one of the visual signals and the audio signal associated with the communicator, where at least one of the signal transmitters includes at least one sensor that senses and transmits a sensor value representing a selected environmental or physiological parameter associated with the communicator.
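
    A minimal sketch of the bookkeeping such an operator interface implies is given below; the class and field names are assumptions, not the patent's terminology. Each of the N communicators is assigned a distinct virtual direction, and the operator can select one and give it priority.

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Communicator:
            ident: str
            azimuth_deg: float                    # virtual direction the operator hears this channel from
            priority: int = 0
            sensor_value: Optional[float] = None  # e.g. a physiological or environmental reading

        def assign_azimuths(idents):
            """Spread N communicators (N >= 2) evenly around the operator."""
            n = len(idents)
            return [Communicator(ident, azimuth_deg=360.0 * i / n) for i, ident in enumerate(idents)]

        team = assign_azimuths(["alpha", "bravo", "charlie"])
        team[1].priority = 1                               # operator assigns priority to "bravo"
        selected = max(team, key=lambda c: c.priority)     # this channel's signals are displayed first
        print(selected.ident, selected.azimuth_deg)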

    Taxonomy and Definitions for Sonification and Auditory Display

    Hermann T. Taxonomy and Definitions for Sonification and Auditory Display. In: Susini P, Warusfel O, eds. Proceedings of the 14th International Conference on Auditory Display (ICAD 2008). Paris, France: IRCAM; 2008. Sonification is still a young research field, and many terms such as sonification, auditory display, auralization, and audification have been used without a precise definition. Recent developments such as the introduction of Model-based Sonification, the establishment of interactive sonification, and the increased interest in sonification from the arts have raised the issue of revisiting the definitions towards a clearer terminology. This paper introduces a new definition for sonification and auditory display that emphasizes necessary and sufficient conditions for organized sound to be called sonification. It furthermore suggests a taxonomy and discusses the relation between visualization and sonification. A hierarchy of closed-loop interactions is also introduced. This paper aims at initiating vivid discussions towards the establishment of a deeper theory of sonification and auditory display.

    Embodied Cognition In Auditory Display

    Presented at the 19th International Conference on Auditory Display (ICAD2013) on July 6-9, 2013 in Lodz, Poland. This paper makes a case for the use of an embodied cognition framework, based on embodied schemata and cross-domain mappings, in the design of auditory display. An overview of research that relates auditory display with embodied cognition is provided to support such a framework. The paper then describes research efforts towards the development of this framework. By designing to support human cognitive competencies that are bound up with meaning making, the aim is to open the door to the creation of more meaningful and intuitive auditory displays.

    Neural coding of high-frequency tones

    Available evidence was presented indicating that neural discharges in the auditory nerve display characteristic periodicities in response to any tonal stimulus, including high-frequency stimuli, and that this periodicity corresponds to the subjective pitch.
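
    As an illustration of reading pitch out of such discharge periodicities (this is not the paper's analysis; the spike times, the interval resolution, and the read-out rule are assumptions), the sketch below histograms interspike intervals and takes the reciprocal of the dominant interval as the periodicity pitch.

        from collections import Counter

        def periodicity_pitch(spike_times_s, resolution_s=1e-4):
            """Estimate a pitch in Hz from the most common interspike interval."""
            intervals = [round((b - a) / resolution_s) * resolution_s
                         for a, b in zip(spike_times_s, spike_times_s[1:])]
            dominant_interval, _ = Counter(intervals).most_common(1)[0]
            return 1.0 / dominant_interval

        # Discharges locked to a 500 Hz tone (2 ms period), with one skipped cycle:
        spikes = [0.000, 0.002, 0.004, 0.008, 0.010, 0.012]
        print(periodicity_pitch(spikes))   # approximately 500 Hz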

    The design of mixed-use virtual auditory displays: Recent findings with a dual-task paradigm

    Presented at the 10th International Conference on Auditory Display (ICAD2004). In the third of an ongoing series of exploratory sound information display studies, we augmented a dual task with a mixed-use auditory display designed to provide relevant alert information for each task. The tasks entail a continuous tracking activity and a series of intermittent classification decisions that, in the present study, were presented on separate monitors roughly 90° apart. Using a 2-by-3 design that manipulated both the use of sound in each task and where sounds for the decision task were positioned, the following principal questions were addressed: Can tracking performance be improved with a varying auditory alert tied to error? To what degree do listeners use virtual auditory deixis as a cue for improving decision reaction times? Can a previous finding involving participants' use of sound offsets (cessations) be repeated? And, last, are there performance consequences when auditory displays for separate tasks are combined? Respectively, we found that: Tracking performance as measured by RMS error was not improved and was apparently negatively affected by the use of our auditory design. Listeners' use of even limited virtual auditory deixis is robust, but it is probably also sensitive to the degree to which it is coincident with the location of corresponding visual stimuli in the task environment. On the basis of manually collected head movement data, listeners do make opportunistic use of sound offsets. And, finally, a significant interaction, as measured by average participant reaction time, was observed between the auditory display used for one task and the manipulation of the degree of auditory deixis encoded in the auditory display used for the other task in our paradigm.
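
    One way to realize a varying auditory alert tied to error is sketched below; the study's actual display design is not specified in the abstract, so the dead band, parameter ranges, and mappings are assumptions. The alert stays silent while tracking error is small, and its pitch and repetition rate grow with the error.

        def error_to_alert(rms_error, dead_band=0.1, max_error=1.0,
                           f_low=400.0, f_high=1200.0, rate_low=1.0, rate_high=8.0):
            """Return (freq_hz, repeats_per_s) for the current tracking error, or None when on target."""
            if rms_error <= dead_band:
                return None                                        # inside the dead band: no alert
            x = min((rms_error - dead_band) / (max_error - dead_band), 1.0)
            return f_low + x * (f_high - f_low), rate_low + x * (rate_high - rate_low)

        for e in (0.05, 0.3, 0.9):
            print(e, error_to_alert(e))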